Gradient Descent Methods for Large-Scale Linear Inverse Problems by Chen (Cool)
Abstract
A thesis submitted to the Faculty of Emory College of Emory University in partial fulfillment of the requirements for the degree of Bachelor of Sciences with Honors, Department of Mathematics and Computer Science.
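No abstract text survives beyond the submission statement, but the title points at a standard setup. As a minimal sketch, assuming the classic Tikhonov-regularized least-squares formulation min_x ||Ax - b||^2 + lam*||x||^2 (a common model for large-scale linear inverse problems, not taken from the thesis itself), plain gradient descent looks like this:

```python
import numpy as np

# Illustrative only: gradient descent on a Tikhonov-regularized
# least-squares problem, min_x ||Ax - b||^2 + lam * ||x||^2.
rng = np.random.default_rng(0)
m, n = 200, 100
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(m)
lam = 1e-2

x = np.zeros(n)
# Constant step size 1/L, where L = 2 * (sigma_max(A)^2 + lam) is the
# Lipschitz constant of the gradient.
L = 2 * (np.linalg.norm(A, 2) ** 2 + lam)
for _ in range(500):
    grad = 2 * A.T @ (A @ x - b) + 2 * lam * x
    x -= grad / L

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```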
Similar Resources
Scaling up Natural Gradient by Sparsely Factorizing the Inverse Fisher Matrix
Second-order optimization methods, such as natural gradient, are difficult to apply to high-dimensional problems because they require approximately solving large linear systems. We present FActorized Natural Gradient (FANG), an approximation to natural gradient descent where the Fisher matrix is approximated with a Gaussian graphical model whose precision matrix can be computed efficiently. We ...
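A sketch of the generic natural-gradient update theta <- theta - eta * F^{-1} g that this abstract builds on. FANG's actual sparse Gaussian graphical-model factorization of the Fisher is not reproduced here; a diagonal empirical Fisher stands in for it, purely to show why a structured approximation makes the linear solve cheap. The toy logistic-regression problem is hypothetical.

```python
import numpy as np

# Natural-gradient update with the crudest structured Fisher
# approximation: a diagonal empirical Fisher, invertible in O(d).
rng = np.random.default_rng(0)
n, d = 500, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = (X @ w_true + 0.5 * rng.standard_normal(n) > 0).astype(float)

w = np.zeros(d)
eta, eps = 0.1, 1e-8
for _ in range(100):
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30, 30)))  # predicted probabilities
    G = X * (p - y)[:, None]                  # per-example gradients
    g = G.mean(axis=0)                        # mean gradient
    fisher_diag = (G ** 2).mean(axis=0) + eps # diagonal empirical Fisher
    w -= eta * g / fisher_diag                # "solve" F x = g elementwise
```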
Polynomial approximation method for stochastic programming
Dongxue Ma, October 2, 2009. Two-stage stochastic programming is an important part of the broader field of stochastic programming and is widely used in multiple disciplines, such as financial management, risk management, and logistics. Two-stage stochastic programming is a natural extension of linear programming that incorporates uncertainty ...
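To make the problem class concrete: below is a minimal two-stage stochastic program (a newsvendor instance) written as a single scenario-expanded LP. This only illustrates the formulation the abstract defines; the paper's polynomial approximation method is not reproduced, and all numbers are made up.

```python
import numpy as np
from scipy.optimize import linprog

cost, price, salvage = 1.0, 1.5, 0.5
demand = np.array([60.0, 100.0, 140.0])  # hypothetical demand scenarios
prob = np.array([0.3, 0.5, 0.2])         # scenario probabilities
S = len(demand)

# Variables: [x, y_1..y_S, z_1..z_S]; x = order quantity (stage 1),
# y_s = units sold, z_s = units salvaged in scenario s (stage 2).
# Minimize  cost*x - sum_s prob_s * (price*y_s + salvage*z_s).
c = np.concatenate(([cost], -price * prob, -salvage * prob))

# Coupling constraints: y_s + z_s <= x for each scenario.
A_ub = np.zeros((S, 1 + 2 * S))
A_ub[:, 0] = -1.0
for s in range(S):
    A_ub[s, 1 + s] = 1.0          # y_s
    A_ub[s, 1 + S + s] = 1.0      # z_s
b_ub = np.zeros(S)

# Bounds: x >= 0, 0 <= y_s <= demand_s, z_s >= 0.
bounds = [(0, None)] + [(0, d) for d in demand] + [(0, None)] * S

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("optimal order quantity:", res.x[0])
```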
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, especially ...
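A generic hybrid nonlinear CG sketch, not the paper's specific algorithm: it mixes the Fletcher-Reeves and Polak-Ribiere parameters via beta = max(0, min(beta_PR, beta_FR)) and takes steps from SciPy's strong-Wolfe line search. The Rosenbrock test function is an assumption for illustration.

```python
import numpy as np
from scipy.optimize import line_search

def f(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

x = np.array([-1.2, 1.0])
g = grad(x)
d = -g                                            # start with steepest descent
for _ in range(200):
    alpha = line_search(f, grad, x, d, gfk=g)[0]  # strong Wolfe step
    if alpha is None:                             # line search failed: restart
        d, alpha = -g, 1e-4
    x = x + alpha * d
    g_new = grad(x)
    beta_fr = (g_new @ g_new) / (g @ g)
    beta_pr = (g_new @ (g_new - g)) / (g @ g)
    beta = max(0.0, min(beta_pr, beta_fr))        # hybrid beta
    d = -g_new + beta * d
    g = g_new
    if np.linalg.norm(g) < 1e-8:
        break
print("minimizer:", x)
```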
Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems
We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier–Stokes equation model was used for adjoint parameter estimation. The methods compared consist of three versions of the nonlinear conjugate-gradient (CG) method, the quasi-Newton Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, and the limited-memory ...
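A sketch of how such a head-to-head comparison can be wired up, on a deliberately ill-conditioned toy least-squares problem (a Hilbert matrix) rather than the paper's parabolized Navier-Stokes adjoint model; the solver names are SciPy's, not the paper's implementations.

```python
import numpy as np
from scipy.optimize import minimize

n = 12
# Hilbert matrix: a classic ill-conditioned test operator.
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
b = A @ x_true
lam = 1e-8  # small Tikhonov term

def fun(x):
    r = A @ x - b
    return r @ r + lam * (x @ x)

def jac(x):
    return 2 * A.T @ (A @ x - b) + 2 * lam * x

for method in ("CG", "BFGS", "L-BFGS-B"):
    res = minimize(fun, np.zeros(n), jac=jac, method=method,
                   options={"maxiter": 5000})
    print(f"{method:>8}: f = {res.fun:.3e}, iterations = {res.nit}")
```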
Stochastic Gradient Descent Methods for Estimation with Large Data Sets
We develop methods for parameter estimation in settings with large-scale data sets, where traditional methods are no longer tenable. Our methods rely on stochastic approximations, which are computationally efficient as they maintain one iterate as a parameter estimate, and successively update that iterate based on a single data point. When the update is based on a noisy gradient, the stochastic...
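A minimal sketch of the update rule the abstract describes: a single iterate is kept as the parameter estimate and refined with one data point at a time using a noisy gradient. The least-squares loss, Robbins-Monro step sizes eta_t = eta0 / t, and toy linear-regression data are assumptions for illustration, not the paper's estimators.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10
theta_true = rng.standard_normal(d)

theta = np.zeros(d)  # single iterate maintained as the running estimate
eta0 = 1.0
for t in range(1, 100_001):
    x = rng.standard_normal(d)                       # one data point arrives
    y = x @ theta_true + 0.1 * rng.standard_normal()
    grad = (x @ theta - y) * x   # noisy gradient of 0.5 * (x'theta - y)^2
    theta -= (eta0 / t) * grad   # stochastic approximation step

print("estimation error:", np.linalg.norm(theta - theta_true))
```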